The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of $O(N)$, where $N$ is the system size; beyond this threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been developed further toward realistic networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which are inconsistent with the recent experimental discovery that the number of connections of each neuron appears to be heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that on the fully connected network: the storage capacity becomes tremendously enhanced, but at the cost of some error in memory retrieval, which appears as the heterogeneity of the connections is increased. Moreover, the error rates obtained on several real neural networks are indeed similar to those on scale-free model networks.
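For context, the associative retrieval described above can be sketched for the standard fully connected case: patterns are stored in Hebbian couplings $J_{ij} = \frac{1}{N}\sum_\mu \xi_i^\mu \xi_j^\mu$, and a corrupted probe relaxes under zero-temperature dynamics $s_i \leftarrow \mathrm{sign}(\sum_j J_{ij} s_j)$. The following is a minimal illustrative sketch (system size, number of patterns, and noise level are arbitrary choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100   # system size (number of neurons)
P = 3     # number of stored patterns, well below the capacity threshold

# Random +/-1 patterns; Hebbian couplings with no self-coupling
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def retrieve(state, J, sweeps=10):
    """Zero-temperature asynchronous update: s_i <- sign(sum_j J_ij s_j)."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Corrupt 10% of the first pattern's bits, then let the network relax
probe = patterns[0].copy()
probe[rng.choice(N, size=10, replace=False)] *= -1

recovered = retrieve(probe, J)
overlap = recovered @ patterns[0] / N  # overlap m = 1 means error-free retrieval
print(overlap)
```

At this low storage load the overlap with the stored pattern returns to (or very near) 1, illustrating the error-free retrieval regime below capacity; the paper's point is that on scale-free networks this picture changes, trading retrieval errors for a much larger capacity.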